Discovering Hidden Variables in Noisy-Or Networks using Quartet Tests – Supplementary Material

Authors

  • Yacine Jernite
  • Yoni Halpern
  • David Sontag
Abstract

Algorithm 2 QUARTET: Coherence test
Require: quartet (a, b, c, d), empirical moments M̂(a,b,c,d), threshold τ_q
Ensure: boolean singly-coupled
  M(a,b) = unfold(M̂(a,b,c,d), (a, b), (c, d))
  M(a,c) = unfold(M̂(a,b,c,d), (a, c), (b, d))
  M(a,d) = unfold(M̂(a,b,c,d), (a, d), (b, c))
  for u ∈ {b, c, d} do
    λ_3,(a,u) ← third eigenvalue of M(a,u)
  end for
  if max_u λ_3,(a,u) > τ_q then
    return False
  end if
  return True
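
A minimal NumPy sketch of this test follows, assuming the empirical moments M̂(a,b,c,d) are supplied as a 2×2×2×2 array whose axes are ordered (a, b, c, d). The function names and the default threshold are illustrative rather than taken from the paper, and the pseudocode's "third eigenvalue" is interpreted here as the third-largest eigenvalue in absolute value (the third singular value would be a common alternative for such an approximate rank-2 check).

    import numpy as np

    def unfold(moments, row_axes, col_axes):
        # Rearrange the 2x2x2x2 moment tensor into a 4x4 matrix whose rows index
        # the joint states of row_axes and whose columns index those of col_axes.
        return np.transpose(moments, row_axes + col_axes).reshape(4, 4)

    def quartet_coherence_test(moments, tau_q=1e-2):
        # Pairings of a (axis 0) with b, c, d, matching M(a,b), M(a,c), M(a,d).
        pairings = [((0, 1), (2, 3)),
                    ((0, 2), (1, 3)),
                    ((0, 3), (1, 2))]
        thirds = []
        for rows, cols in pairings:
            M = unfold(moments, rows, cols)
            lam = np.sort(np.abs(np.linalg.eigvals(M)))[::-1]  # eigenvalue magnitudes, descending
            thirds.append(lam[2])
        # Singly-coupled iff every unfolding is approximately rank 2.
        return max(thirds) <= tau_q

In practice M̂ would be the empirical joint distribution of the four binary variables, e.g. a normalized 2×2×2×2 count table estimated from data.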


Related articles

Discovering Hidden Variables in Noisy-Or Networks using Quartet Tests

We give a polynomial-time algorithm for provably learning the structure and parameters of bipartite noisy-or Bayesian networks of binary variables where the top layer is completely hidden. Unsupervised learning of these models is a form of discrete factor analysis, enabling the discovery of hidden variables and their causal relationships with observed data. We obtain an efficient learning algor...
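
For context, the bipartite noisy-or model referred to here has a top layer of hidden binary variables and a bottom layer of observed binary variables; each observed variable turns on unless every active parent (plus a leak term) independently fails to activate it. The sampler below is a rough sketch of that generative process under a standard noisy-or parameterization; the sizes and parameter values are made up for illustration and are not from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n_hidden, n_obs = 3, 6                                   # illustrative sizes
    prior = np.full(n_hidden, 0.3)                           # P(U_i = 1), assumed
    failure = rng.uniform(0.2, 0.9, (n_hidden, n_obs))       # f_ij: P(parent i fails to activate child j)
    leak = np.full(n_obs, 0.95)                              # P(child j stays off with no active parent)

    def sample(n_samples):
        # Hidden layer: independent Bernoulli parents.
        U = (rng.random((n_samples, n_hidden)) < prior).astype(int)
        # Noisy-or: P(X_j = 0 | U) = leak_j * prod_i f_ij ** U_i
        p_off = leak * np.exp(U @ np.log(failure))
        X = (rng.random((n_samples, n_obs)) >= p_off).astype(int)
        return U, X

The learning problem described in the abstract is to recover the hidden layer's structure and these parameters from samples of X alone.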


Unfolding Latent Tree Structures using 4th Order Tensors

Discovering the latent structure from many observed variables is an important yet challenging learning task. Existing approaches for discovering latent structures often require the unknown number of hidden states as an input. In this paper, we propose a quartet based approach which is agnostic to this number. The key contribution is a novel rank characterization of the tensor associated with th...


Generalized Ideal Parent (GIP): Discovering non-Gaussian Hidden Variables

A formidable challenge in uncertainty modeling in general, and when learning Bayesian networks in particular, is the discovery of unknown hidden variables. The few works that tackle this task are typically limited to discrete or Gaussian domains, or to tree structures. We propose a novel approach for discovering hidden variables in flexible non-Gaussian domains using the powerful class of Gaussian c...


Quartet-Based Learning of Hierarchical Latent Class Models: Discovery of Shallow Latent Variables

Hierarchical latent class (HLC) models are tree-structured Bayesian networks where leaf nodes are observed while internal nodes are hidden. The currently most efficient algorithm for learning HLC models can deal with only a few dozen observed variables. While this is sufficient for some applications, more efficient algorithms are needed for domains with, e.g., hundreds of observed variables. Wi...


Quartet-Based Learning of Shallow Latent Variables

Hierarchical latent class (HLC) models are tree-structured Bayesian networks where leaf nodes are observed while internal nodes are hidden. We explore the following two-stage approach for learning HLC models: One first identifies the shallow latent variables – latent variables adjacent to observed variables – and then determines the structure among the shallow and possibly some other “deep” late...




Journal:

Volume   Issue

Pages   -

Publication date: 2013